JPMorgan's CIO Has Championed A Data Platform That Turbocharges AI
JPMorgan Chase sees artificial intelligence (AI) as critical to its future success. And the mega-bank has a big advantage over many of its smaller rivals: the massive amount of data it gathers from sources such as the 50% of U.S. households with which it has some form of relationship and the $6 trillion worth of payment flows it handles daily. But until recently, identifying and pulling in relevant data to train AI models was taking up around 60% of the time of the bank's growing army of data scientists. That was an inefficient use of an expensive and relatively scarce resource. Now a new data platform the bank has developed, called OmniAI, is helping it to get relevant data into its models much faster.
- Information Technology > Artificial Intelligence (1.00)
- Information Technology > Data Science > Data Mining > Big Data (0.62)
- North America > United States > New York (0.06)
- North America > United States > California (0.06)
Microsoft* Turbocharges AI with Intel FPGAs. You Can, Too.
Today, Microsoft* announced a public preview of Azure Machine Learning Hardware Accelerated Models powered by Project Brainwave*, a new AI inferencing service. The service uses Intel Arria 10 FPGAs, configured as "soft DNN processing units" highly tuned to the ResNet-50 image recognition model, to deliver extraordinary throughput. Microsoft calls it "real time AI." One year ago, Microsoft Azure CTO Mark Russinovich described the plan to build the Azure Cloud Services infrastructure with an FPGA in every node. Instead of creating node pools with specialized hardware accelerators for the wide-ranging workloads deployed in Azure, the Microsoft team went with the flexibility of FPGAs, which can be reconfigured to provide hardware acceleration aligned to nearly any task.
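To use an image-recognition service built around ResNet-50, a client still has to hand the model input in the shape it expects. The sketch below shows the standard ResNet-50 client-side preprocessing (scale to [0, 1], normalize with the commonly used ImageNet statistics, reorder to NCHW); it is a generic illustration, not the Brainwave service's actual API, and the function name and constants are assumptions for this example.

```python
import numpy as np

# ImageNet normalization constants commonly used with ResNet-50
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(image_uint8: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 uint8 image into the NCHW float32 tensor
    ResNet-50 typically expects: values scaled to [0, 1], normalized
    with ImageNet statistics, channels first, plus a batch dimension."""
    x = image_uint8.astype(np.float32) / 255.0
    x = (x - IMAGENET_MEAN) / IMAGENET_STD   # broadcast over H and W
    x = np.transpose(x, (2, 0, 1))           # HWC -> CHW
    return x[np.newaxis, ...]                # add batch dim -> NCHW

# Example: a dummy 224x224 RGB frame, the input size ResNet-50 uses
frame = np.zeros((224, 224, 3), dtype=np.uint8)
batch = preprocess(frame)
print(batch.shape)  # (1, 3, 224, 224)
```

The resulting tensor is what would be serialized and sent to an inference endpoint; the FPGA-side acceleration is transparent to the caller.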